Regression Discontinuity Design with Spillovers
Eric Auerbach, Yong Cai, Ahnaf Rafi
https://arxiv.org/abs/2404.06471 https://arxiv.org/pdf/2404.06471
Abstract: Researchers who estimate treatment effects using a regression discontinuity design (RDD) typically assume that there are no spillovers between the treated and control units. This may be unrealistic. We characterize the estimand of the RDD in a setting where spillovers occur between units that are close in their values of the running variable. Under the assumption that spillovers are linear-in-means, we show that the estimand depends on the ratio of two terms: (1) the radius over which spillovers occur and (2) the bandwidth used for the local linear regression. Specifically, the RDD estimates the direct treatment effect when the radius is of larger order than the bandwidth, and the total treatment effect when the radius is of smaller order than the bandwidth. In the more realistic regime where the radius and bandwidth are of similar order, the RDD estimand is a mix of the two effects. To recover direct and spillover effects, we propose incorporating estimated spillover terms into the local linear regression -- the local analog of a peer effects regression. We also clarify the settings under which the donut-hole RD is able to eliminate the effects of spillovers.
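The mechanism in the abstract can be illustrated with a small simulation. The sketch below (not the authors' code; all parameter values, the uniform kernel, and the effect sizes are hypothetical) generates outcomes with a direct effect and a linear-in-means spillover over a fixed radius of the running variable, then compares a standard local linear RD fit against one augmented with the estimated spillover (exposure) term, as the abstract proposes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(-1, 1, n)            # running variable
d = (x >= 0).astype(float)           # treatment assigned above cutoff 0

radius = 0.1                         # hypothetical spillover radius
# linear-in-means spillover: each unit's exposure is the mean treatment
# status of units within `radius` of its running variable value
exposure = np.array([d[np.abs(x - xi) <= radius].mean() for xi in x])

tau_direct, tau_spill = 2.0, 2.0     # hypothetical effect sizes
y = 0.5 * x + tau_direct * d + tau_spill * exposure + rng.normal(0, 0.5, n)

h = 0.2                              # bandwidth, same order as the radius
w = np.abs(x) <= h                   # uniform kernel for simplicity

# standard local linear RD: jump in the intercept at the cutoff
X_naive = np.column_stack([np.ones(w.sum()), d[w], x[w], d[w] * x[w]])
beta_naive = np.linalg.lstsq(X_naive, y[w], rcond=None)[0]

# augmented regression with the estimated spillover term -- the local
# analog of a peer effects regression
X_aug = np.column_stack([X_naive, exposure[w]])
beta_aug = np.linalg.lstsq(X_aug, y[w], rcond=None)[0]

# with radius and bandwidth of similar order, the naive jump mixes
# direct and spillover effects; the augmented fit targets tau_direct
print(f"naive RD estimate:     {beta_naive[1]:.2f}")
print(f"augmented RD estimate: {beta_aug[1]:.2f}")
```

Because the exposure variable is continuous but kinked inside the estimation window, the one-sided linear fits extrapolate it differently on each side of the cutoff, which is how the spillover leaks into the naive jump estimate in this regime.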
Noise misleads rotation invariant algorithms on sparse targets
Manfred K. Warmuth (Google Inc), Wojciech Kotłowski (Institute of Computing Science, Poznan University of Technology, Poznan, Poland), Matt Jones (University of Colorado Boulder, Colorado, USA), Ehsan Amid (Google Inc)
https://arxiv.org/abs/2403.02697
Improving the Bit Complexity of Communication for Distributed Convex Optimization
Mehrdad Ghadiri, Yin Tat Lee, Swati Padmanabhan, William Swartworth, David Woodruff, Guanghao Ye
https://arxiv.org/abs/2403.19146
Estimating the linear relation between variables that are never jointly observed: an application in in vivo experiments
Polina Arsenteva, Mohamed Amine Benadjaoud, Hervé Cardot
https://arxiv.org/abs/2403.00140
Estimation of non-uniform blur using a patch-based regression convolutional neural network (CNN)
Luis G. Varela, Laura E. Boucheron, Steven Sandoval, David Voelz, Abu Bucker Siddik
https://arxiv.org/abs/2402.07796
Training Dynamics of Multi-Head Softmax Attention for In-Context Learning: Emergence, Convergence, and Optimality
Siyu Chen, Heejune Sheen, Tianhao Wang, Zhuoran Yang
https://arxiv.org/abs/2402.19442
Online and Offline Robust Multivariate Linear Regression
Antoine Godichon-Baggioni (LPSM), Stephane S. Robin (LPSM), Laure Sansonnet (MIA Paris-Saclay, LPSM)
https://arxiv.org/abs/2404.19496
Meta-Learning with Generalized Ridge Regression: High-dimensional Asymptotics, Optimality and Hyper-covariance Estimation
Yanhao Jin, Krishnakumar Balasubramanian, Debashis Paul
https://arxiv.org/abs/2403.19720
Multi-fidelity Gaussian process surrogate modeling for regression problems in physics
Kislaya Ravi, Vladyslav Fediukov, Felix Dietrich, Tobias Neckel, Fabian Buse, Michael Bergmann, Hans-Joachim Bungartz
https://arxiv.org/abs/2404.11965
skscope: Fast Sparsity-Constrained Optimization in Python
Zezhi Wang, Jin Zhu, Peng Chen, Huiyang Peng, Xiaoke Zhang, Anran Wang, Yu Zheng, Junxian Zhu, Xueqin Wang
https://arxiv.org/abs/2403.18540